
    Human Neonatal Keratinocytes Have Very High Levels of Cellular Vitamin A-Binding Proteins

    Since cellular retinol- and retinoic acid-binding proteins (CRBP and CRABP) mediate the effects of vitamin A on epidermal differentiation, the levels of these binding proteins were measured in the epidermal and dermal layers of newborn human foreskin as well as in primary cultures of keratinocytes and fibroblasts from these layers. Ligand binding assays with saturating concentrations of all-trans-[3H]retinol or of all-trans-[11-3H]retinoic acid were used to quantitate amounts of binding proteins in cytosols prepared from these skin layers or cultured cells. The epidermal levels of CRABP and CRBP (60.9 ± 14.4 and 7.3 ± 1.7 pmol per mg cytosol protein, respectively) were markedly higher than those reported for adult epidermis but were comparable to levels in keratinocytes cultured from neonatal foreskin epidermis (61.8 ± 7.8 and 10.7 ± 2.5, respectively). The levels of CRABP were much lower in the foreskin dermis than in the epidermis, and the levels measured in fibroblasts cultured from this dermis were consistent with the dermal levels. However, CRBP levels in cultured dermal fibroblasts were very low and could not account for the dermal CRBP levels, suggesting that another dermal cell type has high levels of CRBP.
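
    The quantitation described above follows the usual logic of a saturating ligand binding assay: specific binding is total bound radioactivity minus nonspecific binding, converted to pmol via the tracer's specific activity and normalized to cytosol protein. The short Python sketch below illustrates that arithmetic with hypothetical numbers; it is not the paper's data or analysis code.

        # Minimal sketch of converting bound [3H]ligand counts into pmol binding
        # protein per mg cytosol protein (hypothetical values throughout).
        DPM_PER_CI = 2.22e12  # disintegrations per minute in one curie

        def pmol_per_mg(total_dpm, nonspecific_dpm, specific_activity_ci_per_mmol, mg_protein):
            """Specific binding (pmol) normalized to cytosol protein (mg)."""
            specific_dpm = total_dpm - nonspecific_dpm                        # subtract nonspecific binding
            dpm_per_pmol = specific_activity_ci_per_mmol * DPM_PER_CI / 1e9   # Ci/mmol -> dpm/pmol
            return specific_dpm / dpm_per_pmol / mg_protein

        # Hypothetical assay: 35,000 dpm total, 2,000 dpm nonspecific,
        # 50 Ci/mmol tracer, 0.01 mg cytosol protein.
        print(f"{pmol_per_mg(35_000, 2_000, 50, 0.01):.1f} pmol/mg")  # ~29.7 pmol/mg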

    Medication use, renin-angiotensin system inhibitors, and acute care utilization after hospitalization in patients with chronic kidney disease.

    OBJECTIVES: The aims of this secondary analysis were to: (a) characterize medication use following hospital discharge for patients with chronic kidney disease (CKD), and (b) investigate relationships of medication use with the primary composite outcome of acute care utilization 90 days after hospitalization. METHODS: The CKD-Medication Intervention Trial (CKD-MIT) enrolled acutely ill hospitalized patients with CKD stages 3-5 not dialyzed (CKD 3-5 ND). In this post hoc analysis, medication use was characterized, and the relationship of medication use with the primary outcome was evaluated using Cox proportional hazards models. RESULTS: Participants were taking a mean of 12.6 (standard deviation = 5.1) medications, drawn from a wide variety of medication classes. Nearly half of study participants were taking angiotensin-converting enzyme (ACE) inhibitors or angiotensin II receptor blockers (ARBs). ACE inhibitor/ARB use was associated with a decreased risk of the primary outcome (hazard ratio = 0.51; 95% confidence interval 0.28-0.95). CONCLUSIONS: A large number, variety, and complexity of medications were used by hospitalized patients with CKD 3-5 ND. ACE inhibitor or ARB use at hospital discharge was associated with a decreased risk of 90-day acute care utilization.
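
    As a concrete illustration of the analysis described above, the sketch below shows how a Cox proportional hazards model can relate discharge medication use to time to acute care utilization within 90 days. The data frame, column names, and numbers are hypothetical; this is not the CKD-MIT analysis code.

        # Hedged sketch: Cox proportional hazards model with the lifelines package.
        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical per-patient records: follow-up time in days (censored at 90),
        # whether acute care utilization occurred, and ACE inhibitor/ARB use at discharge.
        df = pd.DataFrame({
            "days_to_event": [90, 42, 90, 15, 73, 90, 88, 90],
            "acute_care":    [0,  1,  1,  1,  1,  0,  0,  0],
            "ace_arb_use":   [1,  0,  1,  0,  1,  0,  0,  1],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days_to_event", event_col="acute_care")
        cph.print_summary()  # the hazard ratio for ace_arb_use is exp(coef), with a 95% CI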

    Upper limb prostheses: bridging the sensory gap

    Replacing human hand function with prostheses goes far beyond recreating muscle movement with feedforward motor control. Natural sensory feedback is pivotal for fine dexterous control, and finding both engineering and surgical solutions to replace this complex biological function is imperative to achieve prosthetic hand function that matches the human hand. This review outlines the nature of the problems underlying sensory restitution, the engineering methods that attempt to address this deficit, and the surgical techniques that have been developed to integrate advanced neural interfaces with biological systems. Currently, there is no single solution for restoring sensory feedback. Rather, encouraging animal models and early human studies have demonstrated that some elements of sensation can be restored to improve prosthetic control. However, these techniques are limited to highly specialized institutions, and much further work is required to reproduce the results achieved, with the goal of increasing the availability of advanced closed-loop prostheses that allow sensory feedback to inform more precise feedforward control movements and increase functionality.

    Reducing the Activity and Secretion of Microbial Antioxidants Enhances the Immunogenicity of BCG

    BACKGROUND: In early clinical studies, the live tuberculosis vaccine Mycobacterium bovis BCG exhibited 80% protective efficacy against pulmonary tuberculosis (TB). Although BCG still exhibits reliable protection against TB meningitis and miliary TB in early childhood, it has become less reliable in protecting against pulmonary TB. During decades of in vitro cultivation, BCG not only lost some genes through chromosomal deletions but also underwent gene duplication and other mutations that increased antioxidant production. METHODOLOGY/PRINCIPAL FINDINGS: To determine whether microbial antioxidants influence vaccine immunogenicity, we eliminated duplicated alleles encoding the oxidative stress sigma factor SigH in BCG Tice and reduced the activity and secretion of iron co-factored superoxide dismutase. We then used assays of gene expression and flow cytometry with intracellular cytokine staining to compare BCG-specific immune responses in mice after vaccination with BCG Tice or the modified BCG vaccine. Compared to BCG, the modified vaccine induced greater IL-12p40, RANTES, and IL-21 mRNA in the spleens of mice at three days post-immunization, more cytokine-producing CD8+ lymphocytes at the peak of the primary immune response, and more IL-2-producing CD4+ lymphocytes during the memory phase. The modified vaccine also induced stronger secondary CD4+ lymphocyte responses and greater clearance of challenge bacilli. CONCLUSIONS/SIGNIFICANCE: We conclude that antioxidants produced by BCG suppress host immune responses. These findings challenge the hypothesis that the failure of extensively cultivated BCG vaccines to prevent pulmonary tuberculosis is due to over-attenuation, and instead suggest a new model in which BCG evolved to produce more immunity-suppressing antioxidants. By targeting these antioxidants, it may be possible to restore BCG's ability to protect against pulmonary TB.

    Rising rural body-mass index is the main driver of the global obesity epidemic in adults

    Body-mass index (BMI) has increased steadily in most countries in parallel with a rise in the proportion of the population who live in cities (1,2). This has led to a widely reported view that urbanization is one of the most important drivers of the global rise in obesity (3-6). Here we use 2,009 population-based studies, with measurements of height and weight in more than 112 million adults, to report national, regional and global trends in mean BMI stratified by place of residence (rural or urban) from 1985 to 2017. We show that, contrary to the dominant paradigm, more than 55% of the global rise in mean BMI from 1985 to 2017 (and more than 80% in some low- and middle-income regions) was due to increases in BMI in rural areas. This large contribution stems from the fact that, with the exception of women in sub-Saharan Africa, BMI is increasing at the same rate or faster in rural areas than in cities in low- and middle-income regions. These trends have in turn resulted in a closing (and in some countries a reversal) of the gap in BMI between urban and rural areas in low- and middle-income countries, especially for women. In high-income and industrialized countries, we noted a persistently higher rural BMI, especially for women. There is an urgent need for an integrated approach to rural nutrition that enhances financial and physical access to healthy foods, to avoid replacing the rural undernutrition disadvantage in poor countries with a more general malnutrition disadvantage that entails excessive consumption of low-quality calories.
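
    Attributing a share of the rise in national mean BMI to rural versus urban areas amounts to decomposing the change in a population-weighted mean. The sketch below shows one simple Kitagawa-style way to do this with hypothetical numbers; it is illustrative only and not the study's exact method.

        # Hedged sketch: split the change in mean BMI into within-rural change,
        # within-urban change, and urbanization (shift of population into cities).
        def decompose_bmi_change(p_urban_t0, p_urban_t1,
                                 bmi_urban_t0, bmi_urban_t1,
                                 bmi_rural_t0, bmi_rural_t1):
            p_rural_t0, p_rural_t1 = 1 - p_urban_t0, 1 - p_urban_t1

            # Average weights and urban-rural BMI gap across the two time points.
            p_urban_avg = (p_urban_t0 + p_urban_t1) / 2
            p_rural_avg = (p_rural_t0 + p_rural_t1) / 2
            gap_avg = ((bmi_urban_t0 - bmi_rural_t0) + (bmi_urban_t1 - bmi_rural_t1)) / 2

            rural_part = p_rural_avg * (bmi_rural_t1 - bmi_rural_t0)
            urban_part = p_urban_avg * (bmi_urban_t1 - bmi_urban_t0)
            shift_part = gap_avg * (p_urban_t1 - p_urban_t0)
            return rural_part, urban_part, shift_part

        # Hypothetical country: rural BMI rises faster than urban BMI while the country urbanizes.
        rural, urban, shift = decompose_bmi_change(0.30, 0.50, 24.0, 25.0, 21.5, 24.0)
        total = rural + urban + shift  # equals the actual change in national mean BMI (2.25 here)
        print(f"rural share of the rise: {rural / total:.0%}")  # ~67% in this example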

    A century of trends in adult human height
